Smooth Convex Optimization Using Sub-Zeroth-Order Oracles

Authors

Abstract

We consider the problem of minimizing a smooth, Lipschitz, convex function over a compact, convex set using sub-zeroth-order oracles: an oracle that outputs the sign of the directional derivative at a given point in a given direction, an oracle that compares the function values at a given pair of points, and an oracle that outputs a noisy function value at a given point. We show that the sample complexity of optimization using these oracles is polynomial in the relevant parameters. The algorithm we provide for the comparator oracle is the first with a known rate of convergence that is polynomial in the number of dimensions. We also give an algorithm for the noisy-value oracle that incurs sublinear regret in the number of queries.
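The three oracle models are easiest to see as black-box interfaces. Below is a minimal Python sketch of them, assuming a simple quadratic objective; the finite-difference sign approximation, the noise level sigma, and all helper names are illustrative assumptions, not the paper's constructions.

```python
# Illustrative sub-zeroth-order oracle interfaces (sketch, not the paper's code).
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Example smooth, Lipschitz (on a compact set), convex objective.
    return float(np.sum(x ** 2))

def directional_sign_oracle(x, d, eps=1e-6):
    # Sign of the directional derivative f'(x; d), approximated here
    # by a one-sided finite difference for illustration only.
    return np.sign(f(x + eps * d) - f(x))

def comparator_oracle(x, y):
    # Answers "is f(x) <= f(y)?"; the optimizer never sees the values.
    return f(x) <= f(y)

def noisy_value_oracle(x, sigma=0.1):
    # Function value corrupted by additive noise.
    return f(x) + sigma * rng.standard_normal()
```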


Similar Articles

On Zeroth-Order Stochastic Convex Optimization via Random Walks

We propose a method for zeroth-order stochastic convex optimization that attains the suboptimality rate of Õ(n^7 T^{-1/2}) after T queries for a convex bounded function f : R^n → R. The method is based on a random walk (the Ball Walk) on the epigraph of the function. The randomized approach circumvents the problem of gradient estimation, and appears to be less sensitive to noisy function evaluations c...
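To make the construction concrete, here is a minimal sketch of one Ball Walk step on a truncated epigraph {(x, t) : f(x) ≤ t ≤ B}; the objective, the radius delta, and the bound B are illustrative assumptions, not the paper's tuned parameters.

```python
# One Ball Walk step on a truncated epigraph (illustrative sketch).
import numpy as np

rng = np.random.default_rng(1)

def f(x):
    return float(np.sum(x ** 2))  # illustrative smooth convex objective

def sample_unit_ball(dim):
    # Uniform point in the unit ball: random direction, radius ~ U^(1/dim).
    v = rng.standard_normal(dim)
    v /= np.linalg.norm(v)
    return v * rng.uniform() ** (1.0 / dim)

def in_epigraph(z, bound=10.0):
    # Truncated epigraph {(x, t) : f(x) <= t <= bound}; the cap keeps it bounded.
    x, t = z[:-1], z[-1]
    return f(x) <= t <= bound

def ball_walk_step(z, delta=0.1):
    # Propose a uniform point in the delta-ball around z;
    # stay put if the proposal leaves the truncated epigraph.
    proposal = z + delta * sample_unit_ball(len(z))
    return proposal if in_epigraph(proposal) else z

z = np.array([1.0, 1.0, 5.0])  # (x, t) with f(x) = 2 <= t = 5
for _ in range(1000):
    z = ball_walk_step(z)
# After mixing, the x-component of z concentrates near low values of f.
```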


Convex Optimization with Nonconvex Oracles

In machine learning and optimization, one often wants to minimize a convex objective function F but can only evaluate a noisy approximation F̂ to it. Even though F is convex, the noise may render F̂ nonconvex, making the task of minimizing F intractable in general. As a consequence, several works in theoretical computer science, machine learning and optimization have focused on coming up with pol...
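A tiny example makes the obstruction concrete. In the sketch below, F is convex and F̂ stays within eps of F pointwise, yet F̂ has many spurious local minima; the specific perturbation is an illustrative assumption, not one from the paper.

```python
# Convex F, but a small pointwise perturbation F_hat need not be convex.
import numpy as np

def F(x):
    return x ** 2  # convex

def F_hat(x, eps=0.05):
    # Stays within eps of F, yet its second derivative 2 - (1/eps)*sin(x/eps)
    # changes sign, so F_hat is nonconvex with many local minima.
    return F(x) + eps * np.sin(x / eps)

# A local method run on F_hat can therefore stall far from the minimizer of F,
# even though |F_hat - F| <= eps everywhere.
```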


Stochastic Zeroth-order Optimization in High Dimensions

We consider the problem of optimizing a high-dimensional convex function using stochastic zeroth-order queries. Under sparsity assumptions on the gradients or function values, we present two algorithms: a successive component/feature selection algorithm and a noisy mirror descent algorithm using Lasso gradient estimates, and show that both algorithms have convergence rates that depend only loga...
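As a rough illustration of the second algorithm's key step, the sketch below estimates a sparse gradient from zeroth-order queries by regressing finite differences on random directions with an l1 penalty, then takes a plain (Euclidean) descent step; the objective, sample count, and step size are illustrative assumptions rather than the paper's choices.

```python
# Lasso-based gradient estimation from zeroth-order queries (illustrative sketch).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)

def f(x):
    return float(x[0] ** 2 + 3.0 * x[4] ** 2)  # gradient is sparse

def lasso_gradient_estimate(x, m=50, delta=1e-3, alpha=1e-3):
    U = rng.standard_normal((m, len(x)))   # random perturbation directions
    # Finite differences (f(x + delta*u) - f(x)) / delta ~ <u, grad f(x)>.
    y = np.array([(f(x + delta * u) - f(x)) / delta for u in U])
    model = Lasso(alpha=alpha, fit_intercept=False).fit(U, y)
    return model.coef_                      # sparse gradient estimate

x = np.ones(20)
for _ in range(100):
    x -= 0.1 * lasso_gradient_estimate(x)   # plain Euclidean descent step
```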


Efficient Convex Optimization with Membership Oracles

We consider the problem of minimizing a convex function over a convex set given access only to an evaluation oracle for the function and a membership oracle for the set. We give a simple algorithm which solves this problem with Õ(n^2) oracle calls and Õ(n^3) additional arithmetic operations. Using this result, we obtain more efficient reductions among the five basic oracles for convex sets and func...
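For concreteness, the two oracles in question can be phrased as the following black-box interfaces; the unit ball and the l1 objective are illustrative assumptions, and the algorithm from the abstract itself is not reproduced here.

```python
# The two oracles assumed above, as black-box interfaces (illustrative sketch).
import numpy as np

def evaluation_oracle(x):
    # Returns f(x) only; no gradients are available.
    return float(np.sum(np.abs(x)))

def membership_oracle(x):
    # Answers "is x in the convex set K?"; here K is the unit Euclidean ball.
    return bool(np.linalg.norm(x) <= 1.0)

# An optimizer built on these oracles may only call them as black boxes;
# it never sees a gradient of f or a separating hyperplane for K.
```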


Stochastic first order methods in smooth convex optimization

In this paper, we are interested in the development of efficient first-order methods for convex optimization problems in the simultaneous presence of smoothness of the objective function and stochasticity in the first-order information. First, we consider the Stochastic Primal Gradient method, which is nothing else but the Mirror Descent SA method applied to a smooth function and we develop new...
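As a point of reference for the discussion above, here is a minimal sketch of a Mirror Descent SA iteration with the Euclidean mirror map (i.e., stochastic gradient descent) plus iterate averaging; the objective, noise model, and step-size schedule are illustrative assumptions, not the paper's method.

```python
# Mirror Descent SA with Euclidean mirror map, i.e., averaged SGD (sketch).
import numpy as np

rng = np.random.default_rng(3)

def stochastic_gradient(x, sigma=0.1):
    # Unbiased noisy gradient of the smooth convex f(x) = ||x||^2.
    return 2.0 * x + sigma * rng.standard_normal(x.shape)

x = np.ones(5)
avg = np.zeros_like(x)
T = 1000
for t in range(1, T + 1):
    x -= (1.0 / np.sqrt(t)) * stochastic_gradient(x)  # SA step, eta_t = 1/sqrt(t)
    avg += (x - avg) / t                              # running average of iterates
# The averaged iterate `avg` is the standard SA output for convergence guarantees.
```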



Journal

Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2021

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v35i5.16499